Add Oracle Cloud auth to the Vault Agent #19260

Merged
merged 3 commits into hashicorp:main from vault-agent-oci-auth on Mar 15, 2023

Conversation

@F21 (Contributor) commented Feb 20, 2023

This PR adds the OCI (Oracle Cloud Infrastructure) auto-auth method to the Vault Agent.
It supports authentication via API Keys and Instance Principals.
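
For anyone wanting to try this out, a minimal agent configuration for the new method might look like the sketch below. It is based on the docs page added in this PR; the role name, sink path, and mount path are placeholders, and the exact parameter names (`type`, `role`) should be checked against the final docs.

```shell
# Sketch only: write a minimal agent config, then start the agent.
# type = "apikey" authenticates with an OCI API key; use "instance"
# for Instance Principals. Role name and sink path are placeholders.
cat > agent.hcl <<'EOF'
auto_auth {
  method "oci" {
    mount_path = "auth/oci"
    config = {
      type = "apikey"
      role = "test"
    }
  }

  sink "file" {
    config = {
      path = "/tmp/vault-token"
    }
  }
}
EOF

vault agent -config=agent.hcl
```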

The end-to-end test must be executed on an OCI compute instance, as the OCI auth backend in Vault currently only supports Instance Principal authentication. See: https://developer.hashicorp.com/vault/docs/auth/oci#configure-the-oci-tenancy-to-run-vault
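
For context, the server-side setup from the linked docs is roughly the following; all OCIDs are placeholders, and the role parameters are examples rather than required values.

```shell
# Enable the OCI auth backend and create the role the agent logs in with.
vault auth enable oci
vault write auth/oci/config home_tenancy_id=ocid1.tenancy.oc1..example
vault write auth/oci/role/test \
    ocid_list=ocid1.group.oc1..example \
    token_ttl=30m \
    token_policies=default
```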

To run the tests, set the following environment variables (an example with placeholder values follows the list):

  • OCI_TEST_TENANCY_OCID
  • OCI_TEST_USER_OCID
  • OCI_TEST_FINGERPRINT
  • OCI_TEST_PRIVATE_KEY_PATH
  • OCI_TEST_OCID_LIST
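
For example (every value below is a placeholder):

```shell
export OCI_TEST_TENANCY_OCID=ocid1.tenancy.oc1..example
export OCI_TEST_USER_OCID=ocid1.user.oc1..example
export OCI_TEST_FINGERPRINT=12:34:56:78:9a:bc:de:f0:12:34:56:78:9a:bc:de:f0
export OCI_TEST_PRIVATE_KEY_PATH=$HOME/.oci/oci_api_key.pem
export OCI_TEST_OCID_LIST=ocid1.group.oc1..example

make testacc TEST=./command/agent TESTARGS="-run=TestOCIEndToEnd"
```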

Test results:

$ make testacc TEST=./command/agent TESTARGS="-run=TestOCIEndToEnd"
==> Checking that build is using go version >= 1.20...
==> Using go version 1.20.1...
VAULT_ACC=1 go test -tags='' ./command/agent -v -run=TestOCIEndToEnd -timeout=60m
=== RUN   TestOCIEndToEnd
2023-02-20T14:04:41.452+1100 [DEBUG] core: set config: sanitized config="{\"api_addr\":\"\",\"cache_size\":0,\"cluster_addr\":\"\",\"cluster_cipher_suites\":\"\",\"cluster_name\":\"\",\"default_lease_ttl\":0,\"default_max_request_duration\":0,\"detect_deadlocks\":\"\",\"disable_cache\":false,\"disable_clustering\":false,\"disable_indexing\":false,\"disable_mlock\":false,\"disable_performance_standby\":false,\"disable_printable_check\":false,\"disable_sealwrap\":false,\"disable_sentinel_trace\":false,\"enable_response_header_hostname\":false,\"enable_response_header_raft_node_id\":false,\"enable_ui\":false,\"experiments\":null,\"introspection_endpoint\":false,\"log_format\":\"unspecified\",\"log_level\":\"\",\"log_requests_level\":\"\",\"max_lease_ttl\":0,\"pid_file\":\"\",\"plugin_directory\":\"\",\"plugin_file_permissions\":0,\"plugin_file_uid\":0,\"raw_storage_endpoint\":false}"
2023-02-20T14:04:41.452+1100 [DEBUG] storage.cache: creating LRU cache: size=0
2023-02-20T14:04:41.452+1100 [INFO]  core: Initializing version history cache for core
2023-02-20T14:04:41.452+1100 [DEBUG] core: set config: sanitized config="{\"api_addr\":\"\",\"cache_size\":0,\"cluster_addr\":\"\",\"cluster_cipher_suites\":\"\",\"cluster_name\":\"\",\"default_lease_ttl\":0,\"default_max_request_duration\":0,\"detect_deadlocks\":\"\",\"disable_cache\":false,\"disable_clustering\":false,\"disable_indexing\":false,\"disable_mlock\":false,\"disable_performance_standby\":false,\"disable_printable_check\":false,\"disable_sealwrap\":false,\"disable_sentinel_trace\":false,\"enable_response_header_hostname\":false,\"enable_response_header_raft_node_id\":false,\"enable_ui\":false,\"experiments\":null,\"introspection_endpoint\":false,\"log_format\":\"unspecified\",\"log_level\":\"\",\"log_requests_level\":\"\",\"max_lease_ttl\":0,\"pid_file\":\"\",\"plugin_directory\":\"\",\"plugin_file_permissions\":0,\"plugin_file_uid\":0,\"raw_storage_endpoint\":false}"
2023-02-20T14:04:41.452+1100 [DEBUG] storage.cache: creating LRU cache: size=0
2023-02-20T14:04:41.452+1100 [INFO]  core: Initializing version history cache for core
2023-02-20T14:04:41.452+1100 [DEBUG] core: set config: sanitized config="{\"api_addr\":\"\",\"cache_size\":0,\"cluster_addr\":\"\",\"cluster_cipher_suites\":\"\",\"cluster_name\":\"\",\"default_lease_ttl\":0,\"default_max_request_duration\":0,\"detect_deadlocks\":\"\",\"disable_cache\":false,\"disable_clustering\":false,\"disable_indexing\":false,\"disable_mlock\":false,\"disable_performance_standby\":false,\"disable_printable_check\":false,\"disable_sealwrap\":false,\"disable_sentinel_trace\":false,\"enable_response_header_hostname\":false,\"enable_response_header_raft_node_id\":false,\"enable_ui\":false,\"experiments\":null,\"introspection_endpoint\":false,\"log_format\":\"unspecified\",\"log_level\":\"\",\"log_requests_level\":\"\",\"max_lease_ttl\":0,\"pid_file\":\"\",\"plugin_directory\":\"\",\"plugin_file_permissions\":0,\"plugin_file_uid\":0,\"raw_storage_endpoint\":false}"
2023-02-20T14:04:41.452+1100 [DEBUG] storage.cache: creating LRU cache: size=0
2023-02-20T14:04:41.452+1100 [INFO]  core: Initializing version history cache for core
2023-02-20T14:04:41.453+1100 [INFO]  core: assigning cluster listener for test core: core=0 port=0
2023-02-20T14:04:41.453+1100 [INFO]  core: assigning cluster listener for test core: core=1 port=0
2023-02-20T14:04:41.453+1100 [INFO]  core: assigning cluster listener for test core: core=2 port=0
2023-02-20T14:04:41.453+1100 [INFO]  core: security barrier not initialized
2023-02-20T14:04:41.453+1100 [INFO]  core: security barrier initialized: stored=1 shares=3 threshold=3
2023-02-20T14:04:41.453+1100 [DEBUG] core: cluster name set: name=TestOCIEndToEnd
2023-02-20T14:04:41.453+1100 [DEBUG] core: cluster ID not found, generating new
2023-02-20T14:04:41.454+1100 [DEBUG] core: cluster ID set: id=eb403e83-7ccd-598c-a588-5a797f9f487b
2023-02-20T14:04:41.454+1100 [DEBUG] core: generating cluster private key
2023-02-20T14:04:41.459+1100 [DEBUG] core: generating local cluster certificate: host=fw-3ff7960e-5f03-271a-4513-33d0af917583
2023-02-20T14:04:41.461+1100 [INFO]  core: post-unseal setup starting
2023-02-20T14:04:41.461+1100 [DEBUG] core: clearing forwarding clients
2023-02-20T14:04:41.461+1100 [DEBUG] core: done clearing forwarding clients
2023-02-20T14:04:41.461+1100 [DEBUG] core: persisting feature flags
2023-02-20T14:04:41.461+1100 [INFO]  core: loaded wrapping token key
2023-02-20T14:04:41.461+1100 [INFO]  core: successfully setup plugin catalog: plugin-directory=""
2023-02-20T14:04:41.461+1100 [INFO]  core: no mounts; adding default mount table
2023-02-20T14:04:41.462+1100 [TRACE] core: adding write forwarded paths: paths=[]
2023-02-20T14:04:41.462+1100 [INFO]  core: successfully mounted: type=cubbyhole version="v1.14.0+builtin.vault" path=cubbyhole/ namespace="ID: root. Path: "
2023-02-20T14:04:41.462+1100 [TRACE] core: adding write forwarded paths: paths=[]
2023-02-20T14:04:41.462+1100 [INFO]  core: successfully mounted: type=system version="v1.14.0+builtin.vault" path=sys/ namespace="ID: root. Path: "
2023-02-20T14:04:41.462+1100 [TRACE] core: adding write forwarded paths: paths=[]
2023-02-20T14:04:41.462+1100 [INFO]  core: successfully mounted: type=identity version="v1.14.0+builtin.vault" path=identity/ namespace="ID: root. Path: "
2023-02-20T14:04:41.463+1100 [TRACE] token: no token generation counter found in storage
2023-02-20T14:04:41.463+1100 [TRACE] core: adding write forwarded paths: paths=[]
2023-02-20T14:04:41.463+1100 [INFO]  core: successfully mounted: type=token version="v1.14.0+builtin.vault" path=token/ namespace="ID: root. Path: "
2023-02-20T14:04:41.463+1100 [TRACE] expiration.job-manager: created dispatcher: name=expire-dispatcher num_workers=10
2023-02-20T14:04:41.463+1100 [INFO]  rollback: starting rollback manager
2023-02-20T14:04:41.463+1100 [TRACE] expiration.job-manager: initialized dispatcher: num_workers=10
2023-02-20T14:04:41.463+1100 [TRACE] expiration.job-manager: created job manager: name=expire pool_size=10
2023-02-20T14:04:41.463+1100 [TRACE] expiration.job-manager: starting job manager: name=expire
2023-02-20T14:04:41.463+1100 [TRACE] expiration.job-manager: starting dispatcher
2023-02-20T14:04:41.463+1100 [INFO]  core: restoring leases
2023-02-20T14:04:41.463+1100 [DEBUG] expiration: collecting leases
2023-02-20T14:04:41.463+1100 [DEBUG] expiration: leases collected: num_existing=0
2023-02-20T14:04:41.463+1100 [INFO]  expiration: lease restore complete
2023-02-20T14:04:41.463+1100 [DEBUG] identity: loading entities
2023-02-20T14:04:41.463+1100 [DEBUG] identity: entities collected: num_existing=0
2023-02-20T14:04:41.463+1100 [INFO]  identity: entities restored
2023-02-20T14:04:41.463+1100 [DEBUG] identity: identity loading groups
2023-02-20T14:04:41.463+1100 [DEBUG] identity: groups collected: num_existing=0
2023-02-20T14:04:41.463+1100 [INFO]  identity: groups restored
2023-02-20T14:04:41.463+1100 [DEBUG] identity: identity loading OIDC clients
2023-02-20T14:04:41.463+1100 [TRACE] mfa: loading login MFA configurations
2023-02-20T14:04:41.463+1100 [TRACE] mfa: methods collected: num_existing=0
2023-02-20T14:04:41.463+1100 [TRACE] mfa: configurations restored: namespace="" prefix=login-mfa/method/
2023-02-20T14:04:41.463+1100 [TRACE] mfa: loading login MFA enforcement configurations
2023-02-20T14:04:41.463+1100 [TRACE] mfa: enforcements configs collected: num_existing=0
2023-02-20T14:04:41.463+1100 [TRACE] mfa: enforcement configurations restored: namespace="" prefix=login-mfa/enforcement/
2023-02-20T14:04:41.463+1100 [TRACE] activity: scanned existing logs: out=[]
2023-02-20T14:04:41.463+1100 [INFO]  core: Recorded vault version: vault version=1.14.0 upgrade time="2023-02-20 03:04:41.46374866 +0000 UTC" build date=""
2023-02-20T14:04:41.464+1100 [TRACE] activity: no intent log found
2023-02-20T14:04:41.464+1100 [TRACE] activity: scanned existing logs: out=[]
2023-02-20T14:04:41.464+1100 [INFO]  core: usage gauge collection is disabled
2023-02-20T14:04:41.464+1100 [DEBUG] secrets.identity.identity_a321aa69: wrote OIDC default provider
2023-02-20T14:04:41.507+1100 [DEBUG] secrets.identity.identity_a321aa69: generated OIDC public key to sign JWTs: key_id=6e4a015f-25b6-719d-967a-d5bcc8878f6e
2023-02-20T14:04:41.524+1100 [DEBUG] secrets.identity.identity_a321aa69: generated OIDC public key for future use: key_id=2267ed1b-c613-139d-d14a-35e526b74fea
2023-02-20T14:04:41.524+1100 [DEBUG] secrets.identity.identity_a321aa69: wrote OIDC default key
2023-02-20T14:04:41.524+1100 [DEBUG] secrets.identity.identity_a321aa69: wrote OIDC allow_all assignment
2023-02-20T14:04:41.524+1100 [INFO]  core: post-unseal setup complete
2023-02-20T14:04:41.525+1100 [DEBUG] token: no wal state found when generating token
2023-02-20T14:04:41.525+1100 [INFO]  core: root token generated
2023-02-20T14:04:41.525+1100 [INFO]  core: pre-seal teardown starting
2023-02-20T14:04:41.525+1100 [DEBUG] expiration: stop triggered
2023-02-20T14:04:41.525+1100 [TRACE] expiration.job-manager: terminating job manager...
2023-02-20T14:04:41.525+1100 [TRACE] expiration.job-manager: terminating dispatcher
2023-02-20T14:04:41.525+1100 [DEBUG] expiration: finished stopping
2023-02-20T14:04:41.525+1100 [INFO]  rollback: stopping rollback manager
2023-02-20T14:04:41.525+1100 [INFO]  core: pre-seal teardown complete
2023-02-20T14:04:41.525+1100 [DEBUG] core: unseal key supplied: migrate=false
2023-02-20T14:04:41.525+1100 [DEBUG] core: cannot unseal, not enough keys: keys=1 threshold=3 nonce=6d4d86d6-bcb9-cbbf-e366-aaf66662b018
2023-02-20T14:04:41.525+1100 [DEBUG] core: unseal key supplied: migrate=false
2023-02-20T14:04:41.525+1100 [DEBUG] core: cannot unseal, not enough keys: keys=2 threshold=3 nonce=6d4d86d6-bcb9-cbbf-e366-aaf66662b018
2023-02-20T14:04:41.525+1100 [DEBUG] core: unseal key supplied: migrate=false
2023-02-20T14:04:41.525+1100 [DEBUG] core: starting cluster listeners
2023-02-20T14:04:41.525+1100 [INFO]  core.cluster-listener.tcp: starting listener: listener_address=127.0.0.1:0
2023-02-20T14:04:41.525+1100 [INFO]  core.cluster-listener: serving cluster requests: cluster_listen_address=127.0.0.1:40031
2023-02-20T14:04:41.525+1100 [INFO]  core: vault is unsealed
2023-02-20T14:04:41.525+1100 [INFO]  core: entering standby mode
2023-02-20T14:04:41.525+1100 [INFO]  core: acquired lock, enabling active operation
2023-02-20T14:04:41.525+1100 [DEBUG] core: generating cluster private key
2023-02-20T14:04:41.526+1100 [DEBUG] core: generating local cluster certificate: host=fw-dd8b793d-ebb6-b75c-3aa3-b377ece64937
2023-02-20T14:04:41.528+1100 [INFO]  core: post-unseal setup starting
2023-02-20T14:04:41.528+1100 [DEBUG] core: clearing forwarding clients
2023-02-20T14:04:41.528+1100 [DEBUG] core: done clearing forwarding clients
2023-02-20T14:04:41.528+1100 [DEBUG] core: persisting feature flags
2023-02-20T14:04:41.528+1100 [INFO]  core: loaded wrapping token key
2023-02-20T14:04:41.528+1100 [INFO]  core: successfully setup plugin catalog: plugin-directory=""
2023-02-20T14:04:41.528+1100 [TRACE] core: adding write forwarded paths: paths=[]
2023-02-20T14:04:41.528+1100 [INFO]  core: successfully mounted: type=system version="v1.14.0+builtin.vault" path=sys/ namespace="ID: root. Path: "
2023-02-20T14:04:41.528+1100 [TRACE] core: adding write forwarded paths: paths=[]
2023-02-20T14:04:41.528+1100 [INFO]  core: successfully mounted: type=identity version="v1.14.0+builtin.vault" path=identity/ namespace="ID: root. Path: "
2023-02-20T14:04:41.528+1100 [TRACE] core: adding write forwarded paths: paths=[]
2023-02-20T14:04:41.528+1100 [INFO]  core: successfully mounted: type=cubbyhole version="v1.14.0+builtin.vault" path=cubbyhole/ namespace="ID: root. Path: "
2023-02-20T14:04:41.529+1100 [TRACE] token: no token generation counter found in storage
2023-02-20T14:04:41.529+1100 [TRACE] core: adding write forwarded paths: paths=[]
2023-02-20T14:04:41.529+1100 [INFO]  core: successfully mounted: type=token version="v1.14.0+builtin.vault" path=token/ namespace="ID: root. Path: "
2023-02-20T14:04:41.529+1100 [TRACE] expiration.job-manager: created dispatcher: name=expire-dispatcher num_workers=10
2023-02-20T14:04:41.529+1100 [TRACE] expiration.job-manager: initialized dispatcher: num_workers=10
2023-02-20T14:04:41.529+1100 [TRACE] expiration.job-manager: created job manager: name=expire pool_size=10
2023-02-20T14:04:41.529+1100 [TRACE] expiration.job-manager: starting job manager: name=expire
2023-02-20T14:04:41.529+1100 [TRACE] expiration.job-manager: starting dispatcher
2023-02-20T14:04:41.529+1100 [INFO]  core: restoring leases
2023-02-20T14:04:41.529+1100 [DEBUG] identity: loading entities
2023-02-20T14:04:41.529+1100 [DEBUG] expiration: collecting leases
2023-02-20T14:04:41.529+1100 [DEBUG] expiration: leases collected: num_existing=0
2023-02-20T14:04:41.529+1100 [DEBUG] identity: entities collected: num_existing=0
2023-02-20T14:04:41.529+1100 [INFO]  rollback: starting rollback manager
2023-02-20T14:04:41.529+1100 [INFO]  identity: entities restored
2023-02-20T14:04:41.529+1100 [DEBUG] identity: identity loading groups
2023-02-20T14:04:41.529+1100 [INFO]  expiration: lease restore complete
2023-02-20T14:04:41.529+1100 [DEBUG] identity: groups collected: num_existing=0
2023-02-20T14:04:41.529+1100 [INFO]  identity: groups restored
2023-02-20T14:04:41.529+1100 [DEBUG] identity: identity loading OIDC clients
2023-02-20T14:04:41.529+1100 [TRACE] mfa: loading login MFA configurations
2023-02-20T14:04:41.529+1100 [TRACE] mfa: methods collected: num_existing=0
2023-02-20T14:04:41.529+1100 [TRACE] mfa: configurations restored: namespace="" prefix=login-mfa/method/
2023-02-20T14:04:41.529+1100 [TRACE] mfa: loading login MFA enforcement configurations
2023-02-20T14:04:41.529+1100 [TRACE] mfa: enforcements configs collected: num_existing=0
2023-02-20T14:04:41.529+1100 [TRACE] mfa: enforcement configurations restored: namespace="" prefix=login-mfa/enforcement/
2023-02-20T14:04:41.529+1100 [TRACE] activity: scanned existing logs: out=[]
2023-02-20T14:04:41.529+1100 [DEBUG] core: request forwarding setup function
2023-02-20T14:04:41.529+1100 [DEBUG] core: clearing forwarding clients
2023-02-20T14:04:41.529+1100 [DEBUG] core: done clearing forwarding clients
2023-02-20T14:04:41.529+1100 [DEBUG] core: leaving request forwarding setup function
2023-02-20T14:04:41.529+1100 [TRACE] activity: scanned existing logs: out=[]
2023-02-20T14:04:41.529+1100 [TRACE] activity: no intent log found
2023-02-20T14:04:41.529+1100 [INFO]  core: usage gauge collection is disabled
2023-02-20T14:04:41.529+1100 [INFO]  core: post-unseal setup complete
2023-02-20T14:04:41.530+1100 [TRACE] core: adding write forwarded paths: paths=[]
2023-02-20T14:04:41.530+1100 [INFO]  core: successful mount: namespace="" path=secret/ type=kv version=""
2023-02-20T14:04:41.530+1100 [DEBUG] core: unseal key supplied: migrate=false
2023-02-20T14:04:41.530+1100 [DEBUG] core: cannot unseal, not enough keys: keys=1 threshold=3 nonce=869106d2-3758-a34b-b7b2-b3ec3a1ecfe1
2023-02-20T14:04:41.530+1100 [DEBUG] core: unseal key supplied: migrate=false
2023-02-20T14:04:41.530+1100 [DEBUG] core: cannot unseal, not enough keys: keys=2 threshold=3 nonce=869106d2-3758-a34b-b7b2-b3ec3a1ecfe1
2023-02-20T14:04:41.530+1100 [DEBUG] core: unseal key supplied: migrate=false
2023-02-20T14:04:41.530+1100 [DEBUG] core: starting cluster listeners
2023-02-20T14:04:41.530+1100 [INFO]  core.cluster-listener.tcp: starting listener: listener_address=127.0.0.1:0
2023-02-20T14:04:41.530+1100 [INFO]  core.cluster-listener: serving cluster requests: cluster_listen_address=127.0.0.1:39401
2023-02-20T14:04:41.530+1100 [INFO]  core: vault is unsealed
2023-02-20T14:04:41.530+1100 [DEBUG] core: unseal key supplied: migrate=false
2023-02-20T14:04:41.530+1100 [DEBUG] core: cannot unseal, not enough keys: keys=1 threshold=3 nonce=070582d9-1956-239e-892d-5f356492ffab
2023-02-20T14:04:41.530+1100 [DEBUG] core: unseal key supplied: migrate=false
2023-02-20T14:04:41.530+1100 [DEBUG] core: cannot unseal, not enough keys: keys=2 threshold=3 nonce=070582d9-1956-239e-892d-5f356492ffab
2023-02-20T14:04:41.530+1100 [DEBUG] core: unseal key supplied: migrate=false
2023-02-20T14:04:41.530+1100 [DEBUG] core: starting cluster listeners
2023-02-20T14:04:41.530+1100 [INFO]  core.cluster-listener.tcp: starting listener: listener_address=127.0.0.1:0
2023-02-20T14:04:41.530+1100 [INFO]  core.cluster-listener: serving cluster requests: cluster_listen_address=127.0.0.1:41853
2023-02-20T14:04:41.530+1100 [INFO]  core: vault is unsealed
2023-02-20T14:04:41.530+1100 [INFO]  core: entering standby mode
2023-02-20T14:04:41.530+1100 [INFO]  core: entering standby mode
2023-02-20T14:04:43.531+1100 [TRACE] core: found new active node information, refreshing
2023-02-20T14:04:43.531+1100 [DEBUG] core: parsing information for new active node: active_cluster_addr=https://127.0.0.1:40031 active_redirect_addr=https://127.0.0.1:41207
2023-02-20T14:04:43.531+1100 [DEBUG] core: refreshing forwarding connection: clusterAddr=https://127.0.0.1:40031
2023-02-20T14:04:43.531+1100 [DEBUG] core: clearing forwarding clients
2023-02-20T14:04:43.531+1100 [DEBUG] core: done clearing forwarding clients
2023-02-20T14:04:43.532+1100 [DEBUG] core: done refreshing forwarding connection: clusterAddr=https://127.0.0.1:40031
2023-02-20T14:04:43.532+1100 [TRACE] core: found new active node information, refreshing
2023-02-20T14:04:43.532+1100 [DEBUG] core: parsing information for new active node: active_cluster_addr=https://127.0.0.1:40031 active_redirect_addr=https://127.0.0.1:41207
2023-02-20T14:04:43.532+1100 [DEBUG] core: refreshing forwarding connection: clusterAddr=https://127.0.0.1:40031
2023-02-20T14:04:43.532+1100 [DEBUG] core: clearing forwarding clients
2023-02-20T14:04:43.532+1100 [DEBUG] core: done clearing forwarding clients
2023-02-20T14:04:43.532+1100 [DEBUG] core.cluster-listener: creating rpc dialer: address=127.0.0.1:40031 alpn=req_fw_sb-act_v1 host=fw-dd8b793d-ebb6-b75c-3aa3-b377ece64937
2023-02-20T14:04:43.532+1100 [DEBUG] core: done refreshing forwarding connection: clusterAddr=https://127.0.0.1:40031
2023-02-20T14:04:43.533+1100 [DEBUG] core.cluster-listener: creating rpc dialer: address=127.0.0.1:40031 alpn=req_fw_sb-act_v1 host=fw-dd8b793d-ebb6-b75c-3aa3-b377ece64937
2023-02-20T14:04:43.534+1100 [DEBUG] core.cluster-listener: performing server cert lookup
2023-02-20T14:04:43.535+1100 [INFO]  core: enabled audit backend: path=noop/ type=noop
2023-02-20T14:04:43.537+1100 [INFO]  starting listener for test core: core=0 port=41207
2023-02-20T14:04:43.537+1100 [INFO]  starting listener for test core: core=1 port=45113
2023-02-20T14:04:43.537+1100 [INFO]  starting listener for test core: core=2 port=35051
2023-02-20T14:04:43.540+1100 [TRACE] core: adding write forwarded paths: paths=[]
2023-02-20T14:04:43.540+1100 [INFO]  core: enabled credential backend: path=oci/ type=oci version=""
2023-02-20T14:04:43.541+1100 [DEBUG] core.cluster-listener: performing client cert lookup
    oci_end_to_end_test.go:145: output: /tmp/auth.tokensink.test.50647656
2023-02-20T14:04:43.541+1100 [INFO]  sink.file: creating file sink
2023-02-20T14:04:43.541+1100 [TRACE] sink.file: enter write_token: path=/tmp/auth.tokensink.test.50647656
2023-02-20T14:04:43.541+1100 [TRACE] sink.file: exit write_token: path=/tmp/auth.tokensink.test.50647656
2023-02-20T14:04:43.542+1100 [INFO]  sink.file: file sink configured: path=/tmp/auth.tokensink.test.50647656 mode=-rw-r-----
2023-02-20T14:04:43.542+1100 [INFO]  sink.server: starting sink server
2023-02-20T14:04:43.542+1100 [INFO]  auth.handler: starting auth handler
2023-02-20T14:04:43.542+1100 [INFO]  auth.handler: authenticating
2023-02-20T14:04:43.542+1100 [TRACE] auth.oci: beginning authentication
2023-02-20T14:04:43.543+1100 [TRACE] auth.oci.auth_oci_05dd1969: 6c9c4319-6c2c-a6cb-5b56-142d111f857a: pathLoginUpdate roleName=test
2023-02-20T14:04:43.543+1100 [TRACE] auth.oci.auth_oci_05dd1969: 6c9c4319-6c2c-a6cb-5b56-142d111f857a: Method:=get targetUrl:=/v1/auth/oci/login/test
2023-02-20T14:04:43.543+1100 [DEBUG] core.request-forward: got request forwarding connection
2023-02-20T14:04:43.544+1100 [DEBUG] core.cluster-listener: performing server cert lookup
2023-02-20T14:04:43.546+1100 [DEBUG] core.cluster-listener: performing client cert lookup
2023-02-20T14:04:43.548+1100 [DEBUG] core.request-forward: got request forwarding connection
2023-02-20T14:04:44.281+1100 [TRACE] auth.oci.auth_oci_05dd1969: Authentication ok: Method:=get targetUrl:=/v1/auth/oci/login/test id=6c9c4319-6c2c-a6cb-5b56-142d111f857a
2023-02-20T14:04:44.397+1100 [TRACE] auth.oci.auth_oci_05dd1969: Login ok: Method:=get targetUrl:=/v1/auth/oci/login/test id=6c9c4319-6c2c-a6cb-5b56-142d111f857a
2023-02-20T14:04:44.397+1100 [DEBUG] identity: creating a new entity: alias="id:\"277fbcd5-173f-c695-7961-50729c517fd9\"  canonical_id:\"b319604d-6152-b402-7a18-b06b669142fd\"  mount_type:\"oci\"  mount_accessor:\"auth_oci_05dd1969\"  mount_path:\"auth/oci/\"  name:\"test\"  creation_time:{seconds:1676862284  nanos:397559776}  last_update_time:{seconds:1676862284  nanos:397559776}  namespace_id:\"root\"  local_bucket_key:\"packer/local-aliases/buckets/217\""
2023-02-20T14:04:44.400+1100 [INFO]  auth.handler: authentication successful, sending token to sinks
2023-02-20T14:04:44.401+1100 [INFO]  auth.handler: starting renewal process
2023-02-20T14:04:44.402+1100 [TRACE] sink.file: enter write_token: path=/tmp/auth.tokensink.test.50647656
2023-02-20T14:04:44.402+1100 [INFO]  sink.file: token written: path=/tmp/auth.tokensink.test.50647656
2023-02-20T14:04:44.402+1100 [TRACE] sink.file: exit write_token: path=/tmp/auth.tokensink.test.50647656
2023-02-20T14:04:45.542+1100 [INFO]  cleaning up vault cluster
2023-02-20T14:04:45.542+1100 [INFO]  sink.server: sink server stopped
2023-02-20T14:04:45.542+1100 [INFO]  core: stopping vault test core
2023-02-20T14:04:45.542+1100 [INFO]  core: stopping vault test core
2023-02-20T14:04:45.542+1100 [INFO]  core: listeners successfully shut down
2023-02-20T14:04:45.542+1100 [DEBUG] core: shutdown called
2023-02-20T14:04:45.542+1100 [INFO]  core: listeners successfully shut down
2023-02-20T14:04:45.542+1100 [DEBUG] core: shutdown called
2023-02-20T14:04:45.542+1100 [INFO]  core: stopping vault test core
2023-02-20T14:04:45.542+1100 [INFO]  core: marked as sealed
2023-02-20T14:04:45.543+1100 [DEBUG] core: clearing forwarding clients
2023-02-20T14:04:45.543+1100 [INFO]  core: listeners successfully shut down
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutdown called
2023-02-20T14:04:45.543+1100 [INFO]  core: marked as sealed
2023-02-20T14:04:45.543+1100 [DEBUG] core: clearing forwarding clients
2023-02-20T14:04:45.542+1100 [INFO]  auth.handler: shutdown triggered, stopping lifetime watcher
2023-02-20T14:04:45.543+1100 [INFO]  auth.handler: auth handler stopped
2023-02-20T14:04:45.543+1100 [DEBUG] core: forwarding: stopping heartbeating
2023-02-20T14:04:45.543+1100 [DEBUG] core: done clearing forwarding clients
2023-02-20T14:04:45.543+1100 [TRACE] auth.oci: shutdown triggered, stopping OCI auth handler
2023-02-20T14:04:45.542+1100 [INFO]  core: marked as sealed
2023-02-20T14:04:45.543+1100 [DEBUG] core: clearing forwarding clients
2023-02-20T14:04:45.543+1100 [DEBUG] core: done clearing forwarding clients
2023-02-20T14:04:45.543+1100 [DEBUG] core: finished triggering standbyStopCh for runStandby
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down periodic key rotation checker
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down periodic leader refresh
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down periodic metrics
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down leader elections
2023-02-20T14:04:45.543+1100 [INFO]  core: pre-seal teardown starting
2023-02-20T14:04:45.543+1100 [DEBUG] core: finished triggering standbyStopCh for runStandby
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down periodic key rotation checker
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down periodic leader refresh
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down periodic metrics
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down leader elections
2023-02-20T14:04:45.543+1100 [DEBUG] core: done clearing forwarding clients
2023-02-20T14:04:45.543+1100 [DEBUG] core: finished triggering standbyStopCh for runStandby
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down periodic key rotation checker
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down periodic leader refresh
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down periodic metrics
2023-02-20T14:04:45.543+1100 [DEBUG] core: shutting down leader elections
2023-02-20T14:04:45.543+1100 [DEBUG] core: runStandby done
2023-02-20T14:04:45.543+1100 [INFO]  core: stopping cluster listeners
2023-02-20T14:04:45.543+1100 [DEBUG] core: forwarding: stopping heartbeating
2023-02-20T14:04:45.543+1100 [INFO]  core.cluster-listener: forwarding rpc listeners stopped
2023-02-20T14:04:45.543+1100 [DEBUG] core: runStandby done
2023-02-20T14:04:45.543+1100 [INFO]  core: stopping cluster listeners
2023-02-20T14:04:45.543+1100 [INFO]  core.cluster-listener: forwarding rpc listeners stopped
2023-02-20T14:04:46.036+1100 [INFO]  core.cluster-listener: rpc listeners successfully shut down
2023-02-20T14:04:46.036+1100 [INFO]  core.cluster-listener: rpc listeners successfully shut down
2023-02-20T14:04:46.036+1100 [INFO]  core: cluster listeners successfully shut down
2023-02-20T14:04:46.036+1100 [INFO]  core: cluster listeners successfully shut down
2023-02-20T14:04:46.036+1100 [DEBUG] core: sealing barrier
2023-02-20T14:04:46.036+1100 [DEBUG] core: sealing barrier
2023-02-20T14:04:46.036+1100 [INFO]  core: vault is sealed
2023-02-20T14:04:46.036+1100 [INFO]  core: vault test core stopped
2023-02-20T14:04:46.036+1100 [INFO]  core: vault is sealed
2023-02-20T14:04:46.036+1100 [INFO]  core: vault test core stopped
2023-02-20T14:04:46.044+1100 [DEBUG] expiration: stop triggered
2023-02-20T14:04:46.044+1100 [TRACE] expiration.job-manager: terminating job manager...
2023-02-20T14:04:46.044+1100 [TRACE] expiration.job-manager: terminating dispatcher
2023-02-20T14:04:46.044+1100 [DEBUG] expiration: finished stopping
2023-02-20T14:04:46.044+1100 [INFO]  rollback: stopping rollback manager
2023-02-20T14:04:46.044+1100 [INFO]  core: pre-seal teardown complete
2023-02-20T14:04:46.044+1100 [DEBUG] core: runStandby done
2023-02-20T14:04:46.044+1100 [INFO]  core: stopping cluster listeners
2023-02-20T14:04:46.044+1100 [INFO]  core.cluster-listener: forwarding rpc listeners stopped
2023-02-20T14:04:46.053+1100 [INFO]  core.cluster-listener: rpc listeners successfully shut down
2023-02-20T14:04:46.053+1100 [INFO]  core: cluster listeners successfully shut down
2023-02-20T14:04:46.053+1100 [DEBUG] core: sealing barrier
2023-02-20T14:04:46.053+1100 [INFO]  core: vault is sealed
2023-02-20T14:04:46.053+1100 [INFO]  core: vault test core stopped
--- PASS: TestOCIEndToEnd (5.60s)
PASS
ok      github.com/hashicorp/vault/command/agent

Closes #19195: Support OCI (Oracle Cloud Infrastructure) for Vault Agent auto-auth

@F21 requested a review from yhyakuna as a code owner on February 20, 2023 03:24
@F21 force-pushed the vault-agent-oci-auth branch from ad2abbd to 5051c67 on February 20, 2023 04:02
@F21 force-pushed the vault-agent-oci-auth branch from 5051c67 to 136f21d on March 5, 2023 21:44
@yhyakuna requested a review from VioletHynes on March 9, 2023 04:29
@VioletHynes (Contributor) commented

Hi there! This looks great! Sorry for the delay in reviewing; I only just spotted this today. I'm going to get a second set of eyes on this for the OCI-specific stuff that I'm less familiar with, but in principle I like this PR a lot. Thank you!

@VioletHynes (Contributor) left a comment

Submitted some requested changes. I'm also going to get someone who knows a bit more about OCI to review this, and the merge will be contingent on their approval. Thanks again for the PR!

Review threads (resolved) on:

  • command/agent/auth/oci/oci.go
  • website/content/docs/agent/autoauth/methods/oci.mdx
@F21 force-pushed the vault-agent-oci-auth branch from 136f21d to 22933f3 on March 9, 2023 21:45
@F21 (Contributor, Author) commented Mar 9, 2023

Thanks for the review, @VioletHynes! I left some comments asking for clarification on a few things.

@swenson swenson self-requested a review March 9, 2023 23:27
@F21 (Contributor, Author) commented Mar 11, 2023

Thanks for the review, @VioletHynes! I pushed a commit that uses ParseDurationSecond() to parse credential_poll_interval.
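
For anyone following along: ParseDurationSecond comes from HashiCorp's go-secure-stdlib parseutil package and accepts both bare numbers (treated as seconds) and Go-style duration strings. A minimal sketch of the parsing behavior (the values are made-up examples, not ones from this PR):

```go
package main

import (
	"fmt"

	"github.com/hashicorp/go-secure-stdlib/parseutil"
)

func main() {
	// Both "60" (bare seconds) and "1m" parse to one minute.
	for _, v := range []interface{}{"60", "1m"} {
		d, err := parseutil.ParseDurationSecond(v)
		if err != nil {
			panic(err)
		}
		fmt.Printf("%v -> %s\n", v, d)
	}
}
```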

@F21 force-pushed the vault-agent-oci-auth branch from b50cf06 to 5384508 on March 11, 2023 08:36
@VioletHynes (Contributor) left a comment

Approving, though I'll wait for @swenson's approval before merging, given his OCI expertise.

Thanks a bunch for this PR!

@swenson (Contributor) left a comment

LGTM; I haven't been able to test on OCI due to some account issues, but I will do so as soon as I can. I don't want that to block you, though.

@F21 (Contributor, Author) commented Mar 14, 2023

Thanks for reviewing, @VioletHynes and @swenson! I look forward to this being released 🚢.

@VioletHynes (Contributor) commented

Thanks a bunch for this PR. Great work, and thanks for being patient with reviews/approvals :)

Merging 🚢

@VioletHynes merged commit 789406c into hashicorp:main on Mar 15, 2023
@F21 deleted the vault-agent-oci-auth branch on March 15, 2023 21:26
raymonstah pushed a commit that referenced this pull request Mar 17, 2023
* Add Oracle Cloud auth to the Vault Agent

* Use ParseDurationSecond to parse credential_poll_interval

* Use os.UserHomeDir()

@swenson (Contributor) commented Mar 24, 2023

I finally got my OCI access worked out and verified that this works as expected. Thanks!
